
    Chain minors are FPT

    Given two finite posets P and Q, P is a chain minor of Q if there exists a partial function f from the elements of Q to the elements of P such that for every chain in P there is a chain C_Q in Q with the property that f restricted to C_Q is an isomorphism of chains. We give an algorithm to decide whether a poset P is a chain minor of a poset Q that runs in time O(|Q| log |Q|) for every fixed poset P. This solves an open problem from the monograph by Downey and Fellows [Parameterized Complexity, 1999], who asked whether the problem was fixed-parameter tractable.
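    The chain-minor definition above can be made concrete with a brute-force check over all partial functions from Q to P. This is a toy sketch for very small posets only (exponential in |Q|), nothing like the paper's O(|Q| log |Q|) algorithm; the poset encoding via element lists and `leq` predicates is my own choice of representation.

```python
from itertools import combinations, product
from functools import cmp_to_key

def all_chains(elems, leq):
    """Enumerate all nonempty chains (totally ordered subsets) of a finite poset."""
    result = []
    for r in range(1, len(elems) + 1):
        for sub in combinations(elems, r):
            if all(leq(a, b) or leq(b, a) for a, b in combinations(sub, 2)):
                result.append(sub)
    return result

def sort_chain(c, leq):
    """Order the elements of a chain according to leq."""
    return sorted(c, key=cmp_to_key(lambda a, b: -1 if leq(a, b) else 1))

def restricts_to_iso(cq, cp, f, leqP, leqQ):
    """Is f restricted to the chain cq an isomorphism of chains onto cp?"""
    if len(cq) != len(cp) or any(f[q] is None for q in cq):
        return False
    return [f[q] for q in sort_chain(cq, leqQ)] == sort_chain(cp, leqP)

def is_chain_minor(P, leqP, Q, leqQ):
    """Brute-force test whether P is a chain minor of Q (toy sizes only)."""
    P, Q = list(P), list(Q)
    chains_P = all_chains(P, leqP)
    chains_Q = all_chains(Q, leqQ)
    # A partial function maps each q to an element of P or to None (undefined).
    for values in product(P + [None], repeat=len(Q)):
        f = dict(zip(Q, values))
        if all(any(restricts_to_iso(cq, cp, f, leqP, leqQ) for cq in chains_Q)
               for cp in chains_P):
            return True
    return False
```

    For example, a 2-element chain is a chain minor of a 3-element chain, while a 2-element antichain is not a chain minor of a single point, since one partial function cannot hit both singleton chains.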

    Continuous Monitoring of l_p Norms in Data Streams

    In insertion-only streaming, one sees a sequence of indices a_1, a_2, ..., a_m in [n]. The stream defines a sequence of m frequency vectors x(1), ..., x(m), each in R^n, where x(t) is the frequency vector of items after seeing the first t indices in the stream. Much work in the streaming literature focuses on estimating some function f(x(m)). Many applications, though, require obtaining estimates at time t of f(x(t)), for every t in [m]. Naively, this guarantee is obtained by devising an algorithm with failure probability less than 1/m, then performing a union bound over all stream updates to guarantee that all m estimates are simultaneously accurate with good probability. When f(x) is some l_p norm of x, recent works have shown that this union bound is wasteful and better space complexity is possible for the continuous monitoring problem, with the strongest known results being for p = 2. In this work, we improve the state of the art for all 0 < p < 2, which we obtain via a novel analysis of Indyk's p-stable sketch.
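    The object the abstract analyses, Indyk's p-stable sketch, can be illustrated for p = 1 with Cauchy (1-stable) projections maintained incrementally over an insertion-only stream. This is a minimal sketch with illustrative dimensions and a one-shot median estimate, not the paper's continuous-monitoring scheme.

```python
import numpy as np

# Illustrative parameters (not from the paper): universe size n, sketch rows k.
rng = np.random.default_rng(0)
n, k = 200, 500
C = rng.standard_cauchy((k, n))  # Cauchy entries are 1-stable

# Insertion-only stream of indices in [n]; the sketch is linear, so each
# insertion of index i just adds column C[:, i].
stream = rng.integers(0, n, size=5000)
x = np.zeros(n)
sketch = np.zeros(k)
for i in stream:
    x[i] += 1
    sketch += C[:, i]

# By 1-stability, each sketch coordinate is distributed as ||x||_1 times a
# standard Cauchy, and |standard Cauchy| has median 1, so the median of the
# absolute sketch coordinates estimates the l_1 norm.
est = np.median(np.abs(sketch))
true_l1 = np.abs(x).sum()
```

    At any intermediate time t the same median applied to the current sketch estimates ||x(t)||_1; making that guarantee hold simultaneously for all t in [m] without the naive 1/m union bound is the continuous-monitoring question the abstract addresses.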

    An Improved Lower Bound for Sparse Reconstruction from Subsampled Walsh Matrices

    We give a short argument that yields a new lower bound on the number of uniformly and independently subsampled rows from a bounded orthonormal matrix necessary to form a matrix with the restricted isometry property. We show that a matrix formed by uniformly and independently subsampling rows of an N × N Walsh matrix contains a K-sparse vector in the kernel unless the number of subsampled rows is Ω(K log K log(N/K)); our lower bound applies whenever min(K, N/K) > log^C N. Containing a sparse vector in the kernel precludes not only the restricted isometry property, but more generally the application of those matrices for uniform sparse recovery.
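    For context, the measurement matrices in question can be constructed as follows: a normalized Walsh-Hadamard matrix (a bounded orthonormal system) with rows subsampled uniformly and independently. This is a minimal sketch with illustrative sizes N and m.

```python
import numpy as np

def walsh(N):
    """N x N Walsh-Hadamard matrix via Sylvester's construction (N a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < N:
        H = np.block([[H, H], [H, -H]])
    return H

rng = np.random.default_rng(1)
N, m = 64, 20                      # illustrative sizes, not from the paper
H = walsh(N) / np.sqrt(N)          # orthonormal rows, bounded entries +-1/sqrt(N)
rows = rng.integers(0, N, size=m)  # uniform, independent row subsampling
A = H[rows]                        # the m x N measurement matrix of the abstract
```

    The abstract's lower bound says that unless m = Ω(K log K log(N/K)), such an A contains a K-sparse vector in its kernel, which rules out the restricted isometry property and uniform sparse recovery.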

    An improved analysis of the ER-SpUD dictionary learning algorithm

    In "dictionary learning" we observe Y = AX + E for some Y ∈ ℝ^{n×p}, A ∈ ℝ^{n×m}, and X ∈ ℝ^{m×p}. The matrix Y is observed, and A, X, E are unknown. Here E is "noise" of small norm, and X is column-wise sparse. The matrix A is referred to as a dictionary, and its columns as atoms. Then, given some small number p of samples, i.e. columns of Y, the goal is to learn the dictionary A up to small error, as well as X. The motivation is that in many applications data is expected to be sparse when represented by atoms in the "right" dictionary A (e.g. images in the Haar wavelet basis), and the goal is to learn A from the data to then use it for other applications. Recently, [SWW12] proposed the dictionary learning algorithm ER-SpUD with provable guarantees when E = 0 and m = n. They showed that if X has independent entries with an expected s non-zeroes per column for 1 ≲ s ≲ √n, and with non-zero entries being subgaussian, then for p ≳ n^2 log^2 n, with high probability ER-SpUD outputs matrices A′, X′ which equal A, X up to permuting and scaling columns (resp. rows) of A (resp. X). They conjectured that p ≳ n log n suffices, which they showed was information-theoretically necessary for any algorithm to succeed when s ≃ 1. Significant progress was later obtained in [LV15]. We show that for a slight variant of ER-SpUD, p ≳ n log(n/δ) samples suffice for successful recovery with probability 1 − δ. We also show that for the unmodified ER-SpUD, p ≳ n^{1.99} samples are required even to learn A, X with polynomially small success probability. This resolves the main conjecture of [SWW12], and contradicts the main result of [LV15], which claimed that p ≳ n log^4 n guarantees success with high probability.
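    The generative model the abstract analyses (square dictionary, noiseless observations, Bernoulli-subgaussian sparse coefficients) can be simulated as follows. This is a minimal sketch of the model with illustrative sizes, not the ER-SpUD algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, s = 30, 2000, 4                      # illustrative: dimension, samples, sparsity
A = rng.standard_normal((n, n))            # square dictionary (m = n, as in [SWW12])
support = rng.random((n, p)) < s / n       # each entry nonzero with probability s/n,
                                           # so an expected s nonzeroes per column
X = support * rng.standard_normal((n, p))  # subgaussian values on the support
Y = A @ X                                  # noiseless observations (E = 0)
```

    Given only Y, ER-SpUD recovers rows of X by, roughly, solving linear programs that minimize ||b^T Y||_1 under a single linear constraint built from columns of Y; the abstract's results concern how many samples p such a scheme needs to succeed.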